
[AIRFLOW-2817] Force explicit choice on GPL dependency #3660

Merged: 1 commit into apache:master on Aug 1, 2018

Conversation

@bolkedebruin (Contributor):

By default one of Apache Airflow's dependencies pulls in a GPL library. Airflow should not install (or upgrade) without an explicit choice.

This is part of the Apache requirements, as we cannot depend on Category X software.

https://issues.apache.org/jira/browse/AIRFLOW-2817

cc @kaxil @Fokko This is required for 1.10

@bolkedebruin force-pushed the fix_gpl_default branch 2 times, most recently from aa38c92 to beecd96, on July 28, 2018 18:20
@kaxil (Member) left a comment:

This LGTM but the K8s tests are failing.

@seelmann (Member) commented Jul 28, 2018

The text-unidecode project suggests that, if possible, unidecode should be used (it has better memory usage and better transliteration quality). But is that relevant to how unidecode is used by Airflow/slugify/nvd3? In other words, does usage of text-unidecode have a significant disadvantage?

If not, I'd suggest setting SLUGIFY_USES_TEXT_UNIDECODE as the default and avoiding the required user choice, which makes it easier to install Airflow. It still makes sense to have GPL_UNIDECODE as an optional choice.

If there is a disadvantage, I'd suggest mentioning it in INSTALL and in the printed error. Otherwise users may be confused and not know what to do (at least I am).

setup.py (outdated)

            and not os.getenv("SLUGIFY_USES_TEXT_UNIDECODE") == "yes"):
        raise RuntimeError("By default one of Airflow's dependencies installs a GPL "
                           "dependency (unidecode). To avoid this dependency set "
                           "SLUGIFY_USES_TEXT_UNIDECODE=yes in your environment when you "
Member:

I think we should set a default - most people won't care, especially when testing/evaluating Airflow, and being able to just pip install apache-airflow is a plus.

Having a warning printed about the env var (does pip have a way of surfacing these more directly than printing?) is good, but otherwise just defaulting would be my preferred approach.

Member:

Am I right in thinking that the code will already choose the best option at runtime, and this is just for what is installed at setup time? Might be worth mentioning that in a doc somewhere?

Contributor Author:

@ashb Do you have an idea how to do that? Can we bundle python-slugify, for example? I tried that more or less but couldn't get it to work. Setting the env variable in setup.py also doesn't work, for reasons stated below.

Obviously I'd like to set a default and make it easier for the user.

Member:

Wait - where is slugify even used? I guess it's a couple of levels down? Cos the only use of it in our repo is in dev/airflow-license (and if this is the only use then I say we just force the env var to the GPL version - whichever way round that is).

Is there some other use via another module we pull in?

Member:

Ah, python-nvd3 pulls it in. Right.

setup.py (outdated)

    # See LEGAL-362
    def verify_gpl_dependency():
        if (not os.getenv("GPL_UNIDECODE")
                and not os.getenv("SLUGIFY_USES_TEXT_UNIDECODE") == "yes"):
Member:

Prefix with AIRFLOW_?

Contributor Author:

Can do.

Member:

Oh, though these vars are used by slugify's setup.py - I understand now. Right. Hmmmm.

Thinking and looking.

@bolkedebruin (Contributor Author):

@seelmann I cannot set it as the default; that's the issue. I don't much like the solution I have now, but I cannot set the env variable within the setup. At least I couldn't.

The issue is that pip, for example, uses a separate environment for parsing setup.py. An environment variable set there does not get propagated to the parent, and as such it would still install the GPL version.
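Piecing together the two setup.py excerpts quoted above, the check looks roughly like this (a reconstruction for readability; the import and the tail of the error message are assumptions, since the diff excerpts are truncated):

import os

# See LEGAL-362: refuse to proceed unless the user has explicitly chosen
# between the GPL unidecode and the non-GPL text-unidecode transliteration.
def verify_gpl_dependency():
    if (not os.getenv("GPL_UNIDECODE")
            and not os.getenv("SLUGIFY_USES_TEXT_UNIDECODE") == "yes"):
        raise RuntimeError("By default one of Airflow's dependencies installs a GPL "
                           "dependency (unidecode). To avoid this dependency set "
                           "SLUGIFY_USES_TEXT_UNIDECODE=yes in your environment when you "
                           "install or upgrade Airflow.")  # message tail assumed

Presumably setup.py calls verify_gpl_dependency() before setup(); the excerpts don't show the call site, but that is what makes pip's isolated parsing of setup.py matter here.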

@codecov-io commented Jul 29, 2018

Codecov Report

Merging #3660 into master will decrease coverage by 0.26%.
The diff coverage is n/a.


@@            Coverage Diff             @@
##           master    #3660      +/-   ##
==========================================
- Coverage   77.51%   77.25%   -0.27%     
==========================================
  Files         205      205              
  Lines       15751    15751              
==========================================
- Hits        12210    12168      -42     
- Misses       3541     3583      +42
Impacted Files Coverage Δ
airflow/executors/sequential_executor.py 50% <0%> (-50%) ⬇️
airflow/utils/sqlalchemy.py 67.39% <0%> (-6.53%) ⬇️
airflow/executors/__init__.py 59.61% <0%> (-3.85%) ⬇️
airflow/www_rbac/app.py 95.55% <0%> (-2.23%) ⬇️
airflow/www/app.py 97.05% <0%> (-1.97%) ⬇️
airflow/utils/dag_processing.py 88.6% <0%> (-1.27%) ⬇️
airflow/jobs.py 81.77% <0%> (-0.97%) ⬇️
airflow/configuration.py 83.2% <0%> (-0.75%) ⬇️
airflow/models.py 88.45% <0%> (-0.13%) ⬇️
airflow/www/views.py 68.76% <0%> (-0.13%) ⬇️
... and 4 more

Continue to review full report at Codecov.

Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update ed97204...4dd13d6. Read the comment docs.

@ashb (Member) commented Jul 29, 2018

Okay, I think the problem is that by default pip will prefer to use binary packages (wheels, .whl), and when it does this it will not execute setup.py for that module.

python-slugify is published as a wheel. So in an empty virtual env:

SLUGIFY_USES_TEXT_UNIDECODE=yes pip install python-slugify
...
Successfully installed Unidecode-1.0.22 python-slugify-1.2.5

Even though we set the env var, it still used the "default" setting. This might have been part of the problem.

We can make it use the env var by using the --no-binary flag to pip:

SLUGIFY_USES_TEXT_UNIDECODE=yes pip install --no-binary=python-slugify python-slugify
...
Successfully installed python-slugify-1.2.5 text-unidecode-1.2

I think the docs need updating to mention that we need to pass the --no-binary=python-slugify option to pip install for the env var to have any effect.

I don't think this is truly "solvable" - if python-slugify is already installed then we (Airflow) just get whatever it is already installed with.

I think the best we can/should do is say in the docs "to install without any GPL code do SLUGIFY_USES_TEXT_UNIDECODE=yes pip install apache-airflow --no-binary=python-slugify" - I'm not sure we can do anything else, especially not if binary wheel packages get involved (and I'd like us to start publishing them for Airflow too).

What do Legal need us to do for this?
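(As an aside: to see which transliteration backend a given environment actually ended up with, a quick diagnostic is to probe for the two candidate modules. This snippet is purely illustrative and not part of the PR:)

import importlib.util

# python-slugify depends on exactly one of these two modules,
# depending on how it was built/installed.
for mod in ("unidecode", "text_unidecode"):
    status = "installed" if importlib.util.find_spec(mod) else "absent"
    print(mod, status)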

@ashb (Member) commented Jul 29, 2018

Looking at the linked Jira, where it says "A more explicit choice on GPL is required", I don't think we can realistically get that at setup.py time -- it's sort of too late for us to have any control and, I would say, very counter to the way things usually happen in Python.

Can we satisfy this by having a section at the start of the install docs (both the INSTALL file and the .rst's), instead of requiring something that breaks the flow and convention of Python installs?

(Where did this ask come from? I didn't see it discussed on any list, so I think I might be missing a mailing list subscription?)

@ashb (Member) commented Jul 29, 2018

Since this is a few levels down, if we wanted to try vendoring we would have to vendor python-slugify and python-nvd3 (so it doesn't pull in the "normal" slugify).

As for the mechanics: pip install -t airflow/vendor/ python-slugify --no-deps gets us part way there, but it then gets messy because we would need to pull in python-nvd3 and propagate up all of its deps other than slugify. It gets messy and is a maintenance burden. I would much rather just document the process to get a GPL-free install. If we can get away with it, of course.
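(For context, the usual vendoring pattern would also require making the bundled copy win the import path, roughly as below. This is a generic sketch, not code from this PR; the airflow/vendor/ directory is the hypothetical target of the pip command above:)

import os
import sys

# Put the vendored packages (installed with `pip install -t airflow/vendor/ ...`)
# ahead of site-packages so the bundled copies shadow any system-wide installs.
sys.path.insert(0, os.path.join(os.path.dirname(__file__), "vendor"))

# import slugify  # would now resolve to airflow/vendor/slugify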

@bolkedebruin (Contributor Author):

@ashb I didn’t observe the issue you are seeing with the wheels. If I specify the right env var it will pull in text-unidecode with pip.

I don’t think we can rely on docs here. But we can bring it up on the LEGAL JIRA issue and see what they say. The one reviewing on the IPMC was pretty adamant that the default should point to non-GPL and that we should do more to accomplish this. Please note that avoiding the GPL was already documented.

@ashb (Member) commented Jul 31, 2018

Charting is causing us quite the license headache, isn't it? :(

By default one of Apache Airflow's dependencies pulls in a GPL
library. Airflow should not install (and upgrade) without an explicit choice.

This is part of the Apache requirements as we cannot depend on Category X
software.
@bolkedebruin merged commit c37fc0b into apache:master on Aug 1, 2018
@bolkedebruin (Contributor Author):

Will see if we can address the issue with upstream.

@ashb (Member) commented Aug 1, 2018

If not, I think vendoring python-nvd3 and slugify to use the non-GPL option is probably the way to go.

(Or perhaps replacing python-nvd3 entirely. That's a bigger job though. https://medium.com/@Elijah_Meeks/introducing-semiotic-for-data-visualization-88dc3c6b6926 looks interesting, but uses React, which is fine from a licensing PoV now.) Edit: If we did use this I wouldn't suggest React-ifying the whole app, just the chart part of the page itself. If that's possible.

@ashb (Member) commented Aug 1, 2018

Something about the logic isn't right - everything on Travis is failing on the env check.

I think the other failing PR just needs rebasing actually.

@verdan (Contributor) commented Aug 1, 2018

@ashb I believe we can remove python-nvd3 entirely and use custom JavaScript to render the charts with the d3 and nvd3 JS libraries, just the way we are using the Graph View on the DAG detail page, i.e., sending all the data from Python and implementing the charts on the frontend in templates. But as you said, it will take some time to implement on the frontend, and it won't be ready for the 1.10 release.

P.S. Yes, it is possible to make just a part of the application/page use React.

@ashb (Member) commented Aug 1, 2018

Yeah, this PR is the right fix to get 1.10.0 out. We could vendor for 1.10.1, or leave it as is until 1.11 (which could then switch charting up in some way).

bolkedebruin added a commit that referenced this pull request Aug 3, 2018
By default one of Apache Airflow's dependencies pulls in a GPL
library. Airflow should not install (and upgrade) without an explicit choice.

This is part of the Apache requirements as we cannot depend on Category X
software.

(cherry picked from commit c37fc0b)
Signed-off-by: Bolke de Bruin <[email protected]>
bolkedebruin added a commit that referenced this pull request Aug 3, 2018
By default one of Apache Airflow's dependencies pulls in a GPL
library. Airflow should not install (and upgrade) without an explicit choice.

This is part of the Apache requirements as we cannot depend on Category X
software.

(cherry picked from commit c37fc0b)
Signed-off-by: Bolke de Bruin <[email protected]>
(cherry picked from commit b39e453)
Signed-off-by: Bolke de Bruin <[email protected]>
@tedmiston mentioned this pull request on Aug 6, 2018
kaxil pushed a commit that referenced this pull request Aug 7, 2018
The Read the Docs build process was broken due to #3660. This PR fixes this.
bolkedebruin pushed a commit that referenced this pull request Aug 7, 2018
The Read the Docs build process was broken due to #3660. This PR fixes this.

(cherry picked from commit 8af0aa9)
Signed-off-by: Bolke de Bruin <[email protected]>
bolkedebruin pushed a commit that referenced this pull request Aug 7, 2018
The Read the Docs build process was broken due to #3660. This PR fixes this.

(cherry picked from commit 8af0aa9)
Signed-off-by: Bolke de Bruin <[email protected]>
lxneng pushed a commit to lxneng/incubator-airflow that referenced this pull request Aug 10, 2018
By default one of Apache Airflow's dependencies pulls in a GPL
library. Airflow should not install (and upgrade) without an explicit choice.

This is part of the Apache requirements as we cannot depend on Category X
software.
lxneng pushed a commit to lxneng/incubator-airflow that referenced this pull request Aug 10, 2018
The Read the Docs build process was broken due to apache#3660. This PR fixes this.
@cHYzZQo commented Aug 27, 2018

This is such an anti-pattern IMO - making the package install broken by default. It's also a backward-incompatible change that breaks everyone's current install scripts. I'd request you reconsider. If we really can't use a GPL package by default, print a warning and use the non-GPL option. Packages shouldn't just refuse to install by default.

@cHYzZQo commented Aug 27, 2018

Another option might be to have an extras (bracket) invocation that can be used, e.g.:

airflow[gpl]

or

airflow[non-gpl]

@ashb (Member) commented Aug 28, 2018

Yes, I agree that it's not a good solution, but the problem is that it's not a direct dependency, so airflow[gpl] etc. won't work.

I do want to fix this issue and make the install more pythonic.
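(To spell out why extras fall short here: they are declared in setup.py roughly as below - a hypothetical sketch, Airflow never shipped these extras - and an extra can only add direct requirements. It cannot reach into python-slugify's own setup.py, which is where the unidecode/text-unidecode choice is actually made:)

from setuptools import setup

setup(
    name="example-package",  # hypothetical, for illustration only
    version="0.1",
    extras_require={
        # An extra can pull in a direct dependency like this...
        "gpl": ["unidecode"],
        "non-gpl": ["text-unidecode"],
        # ...but it cannot influence which backend python-slugify's own
        # setup.py selects when pip resolves the transitive dependency.
    },
)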

@bolkedebruin (Contributor Author) commented Aug 28, 2018

The issue needs to be fixed by either removing the dependency on python-nvd3, fixing upstream, or vendoring in python-slugify.

The choice you are describing, @cHYzZQo, is our preferred option, but it doesn't work with package managers given the current state of the upstream package.

aliceabe pushed a commit to aliceabe/incubator-airflow that referenced this pull request Jan 3, 2019
By default one of Apache Airflow's dependencies pulls in a GPL
library. Airflow should not install (and upgrade) without an explicit choice.

This is part of the Apache requirements as we cannot depend on Category X
software.
aliceabe pushed a commit to aliceabe/incubator-airflow that referenced this pull request Jan 3, 2019
The Read the Docs build process was broken due to apache#3660. This PR fixes this.
wmorris75 pushed a commit to modmed/incubator-airflow that referenced this pull request Jul 31, 2019
author Ash Berlin-Taylor <[email protected]> 1564493832 +0100
committer wayne.morris <[email protected]> 1564516048 -0400

[AIRFLOW-5052] Added the include_deleted param to salesforce_hook

[AIRFLOW-1840] Support back-compat on old celery config

The new names are in-line with Celery 4, but if anyone upgrades Airflow without following the UPDATING.md instructions (which we probably assume most people won't, not until something stops working) their workers would suddenly just start failing. That's bad.

This will issue a warning but carry on working as expected. We can remove the deprecation settings (but leave the code in config) after this release has been made.

Closes apache#3549 from ashb/AIRFLOW-1840-back-compat

(cherry picked from commit a4592f9)
Signed-off-by: Bolke de Bruin <[email protected]>

[AIRFLOW-2812] Fix error in Updating.md for upgrading to 1.10

Closes apache#3654 from nrhvyc/AIRFLOW-2812

[AIRFLOW-2816] Fix license text in docs/license.rst

(cherry picked from commit af15f11)
Signed-off-by: Bolke de Bruin <[email protected]>

[AIRFLOW-2817] Force explicit choice on GPL dependency (apache#3660)

By default one of Apache Airflow's dependencies pulls in a GPL
library. Airflow should not install (and upgrade) without an explicit choice.

This is part of the Apache requirements as we cannot depend on Category X
software.

(cherry picked from commit c37fc0b)
Signed-off-by: Bolke de Bruin <[email protected]>
(cherry picked from commit b39e453)
Signed-off-by: Bolke de Bruin <[email protected]>

[AIRFLOW-2869] Remove smart quote from default config

Closes apache#3716 from wdhorton/remove-smart-quote-
from-cfg

(cherry picked from commit 67e2bb9)
Signed-off-by: Bolke de Bruin <[email protected]>
(cherry picked from commit 700f5f0)
Signed-off-by: Bolke de Bruin <[email protected]>

[AIRFLOW-2140] Don't require kubernetes for the SparkSubmit hook (apache#3700)

This extra dep is a quasi-breaking change when upgrading - previously
there were no deps outside of Airflow itself for this hook. Importing
the k8s libs breaks installs that aren't also using Kubernetes.

This makes the dep optional for anyone who doesn't explicitly use the
functionality

(cherry picked from commit 0be002e)
Signed-off-by: Bolke de Bruin <[email protected]>
(cherry picked from commit f58246d)
Signed-off-by: Bolke de Bruin <[email protected]>

[AIRFLOW-2859] Implement own UtcDateTime (apache#3708)

The different UtcDateTime implementations all have issues.
Either they replace tzinfo directly without converting
or they do not convert to UTC at all.

We also ensure all mysql connections are in UTC
in order to keep sanity, as mysql will ignore the
timezone of a field when inserting/updating.

(cherry picked from commit 6fd4e60)
Signed-off-by: Bolke de Bruin <[email protected]>
(cherry picked from commit 8fc8c7a)
Signed-off-by: Bolke de Bruin <[email protected]>

[AIRFLOW-2895] Prevent scheduler from spamming heartbeats/logs

Reverts most of AIRFLOW-2027 until the issues with it can be fixed.

Closes apache#3747 from aoen/revert_min_file_parsing_time_commit

[AIRFLOW-2979] Make celery_result_backend conf Backwards compatible (apache#3832)

(apache#2806) Renamed `celery_result_backend` to `result_backend` and broke backwards compatibility.

[AIRFLOW-2524] Add Amazon SageMaker Training (apache#3658)

Add SageMaker Hook, Training Operator & Sensor
Co-authored-by: srrajeev-aws <[email protected]>

[AIRFLOW-2524] Add Amazon SageMaker Tuning (apache#3751)

Add SageMaker tuning Operator and sensor
Co-authored-by: srrajeev-aws <[email protected]>

[AIRFLOW-2524] Add SageMaker Batch Inference (apache#3767)

* Fix for comments
* Fix sensor test
* Update non_terminal_states and failed_states to static variables of SageMakerHook

Add SageMaker Transform Operator & Sensor
Co-authored-by: srrajeev-aws <[email protected]>

[AIRFLOW-2763] Add check to validate worker connectivity to metadata Database

[AIRFLOW-2786] Gracefully handle Variable import errors (apache#3648)

Variables that are added through a file are not checked as explicitly as creating a Variable in the web UI. This handles exceptions that could be caused by improper keys or values.

[AIRFLOW-2860] DruidHook: time check is wrong (apache#3745)

[AIRFLOW-2773] Validates Dataflow Job Name

Closes apache#3623 from kaxil/AIRFLOW-2773

[AIRFLOW-2845] Asserts in contrib package code are changed on raise ValueError and TypeError (apache#3690)

[AIRFLOW-1917] Trim extra newline and trailing whitespace from log (apache#3862)

[AIRFLOW-XXX] Fix SlackWebhookOperator docs (apache#3915)

The docs refer to `conn_id` while the actual argument is `http_conn_id`.

[AIRFLOW-2912] Add Deploy and Delete operators for GCF (apache#3969)

Both Deploy and Delete operators interact with Google Cloud Functions to manage functions. Both are idempotent and make use of GcfHook, a hook that encapsulates communication with GCP over the GCP API.

[AIRFLOW-3078] Basic operators for Google Compute Engine (apache#4022)

Add GceInstanceStartOperator, GceInstanceStopOperator and GceSetMachineTypeOperator.

Each operator includes:
- core logic
- input params validation
- unit tests
- presence in the example DAG
- docstrings
- How-to and Integration documentation

Additionally, error checking was added in GceHook for the case where the response is 200 OK:

Some types of errors are only visible in the response's "error" field while the overall HTTP response is 200 OK.

That is why, apart from checking if the status is "done", we also check whether "error" is empty; if it is not, an exception is raised with the error message extracted from the "error" field of the response.

In this commit we also separated out Body Field Validator to
separate module in tools - this way it can be reused between
various GCP operators, it has proven to be usable in at least
two of them now.

Co-authored-by: sprzedwojski <[email protected]>
Co-authored-by: potiuk <[email protected]>

[AIRFLOW-3183] Fix bug in DagFileProcessorManager.max_runs_reached() (apache#4031)

The condition is intended to ensure the function will return False if any file's run_count is still smaller than max_run, but the operator used here is "!=". Instead, it should be "<".

This is because in DagFileProcessorManager there is no statement limiting the upper bound of run_count, so it's possible that a file's run_count will grow bigger than max_run. In such a case, the max_runs_reached() method may fail its purpose.

[AIRFLOW-3099] Don't ever warn about missing sections of config (apache#4028)

Rather than looping through and setting each config variable
individually, and having to know which sections are optional and which
aren't, instead we can just call a single function on ConfigParser and
it will read the config from the dict, and more importantly here, never
error about missing sections - it will just create them as needed.

[AIRFLOW-3089] Drop hard-coded url scheme in google auth redirect. (apache#3919)

The google auth provider hard-codes the `_scheme` in the callback url to
`https` so that airflow generates correct urls when run behind a proxy
that terminates tls. But this means that google auth can't be used when
running without https--for example, during local development. Also,
hard-coding `_scheme` isn't the correct solution to the problem of
running behind a proxy. Instead, the proxy should be configured to set
the `X-Forwarded-Proto` header to `https`; Flask interprets this header
and generates the appropriate callback url without hard-coding the
scheme.

[AIRFLOW-3178] Handle percents signs in configs for airflow run (apache#4029)

* [AIRFLOW-3178] Don't mask defaults() function from ConfigParser

ConfigParser (the base class for AirflowConfigParser) expects defaults()
to be a function - so when we re-assign it to be a property some of the
methods from ConfigParser no longer work.

* [AIRFLOW-3178] Correctly escape percent signs when creating temp config

Otherwise we have a problem when we come to use those values.

* [AIRFLOW-3178] Use os.chmod instead of shelling out

There's no need to run another process for a built in Python function.

This also removes a possible race condition that would make the temporary config file readable by more than the airflow or run-as user. The exact behaviour would depend on the umask we run under and the primary group of our user; likely this would mean the file was readable by members of the airflow group (which in most cases would be just the airflow user). To remove any such possibility we chmod the file before we write to it.

[AIRFLOW-2216] Use profile for AWS hook if S3 config file provided in aws_default connection extra parameters (apache#4011)

Use profile for AWS hook if S3 config file provided in
aws_default connection extra parameters
Add test to validate profile set

[AIRFLOW-3138] Use current data type for migrations (apache#3985)

* Use timestamp instead of timestamp with timezone for migration.

[AIRFLOW-3119] Enable debugging with Celery (apache#3950)

This will enable --loglevel when launching a
celery worker and inherit that LOGGING_LEVEL
setting from airflow.cfg

[AIRFLOW-3197] EMRHook is missing new parameters of the AWS API (apache#4044)

Allow passing any params to the CreateJobFlow API, so that we don't have
to stay up to date with AWS api changes.

[AIRFLOW-3203] Fix DockerOperator & some operator test (apache#4049)

- For argument `image`, no need to explicitly
  add "latest" if tag is omitted.
  "latest" will be used by default if no
  tag provided. This is handled by `docker` package itself.

- Intermediate variable `cpu_shares` is not needed.

- Fix wrong usage of `cpu_shares` and `cpu_shares`.
  Based on https://docker-py.readthedocs.io/en/stable/api.html#docker.api.container.ContainerApiMixin.create_host_config, they should be arguments of self.cli.create_host_config() rather than APIClient.create_container().

- Change name of the corresponding test script,
  to ensure it can be discovered.

- Fix the test itself.

- Some other test scripts are not named properly,
  which result in failure of test discovery.

[AIRFLOW-3232] More readable GCF operator documentation (apache#4067)

[AIRFLOW-3231] Basic operators for Google Cloud SQL (apache#4097)

Add CloudSqlInstanceInsertOperator, CloudSqlInstancePatchOperator and CloudSqlInstanceDeleteOperator.

Each operator includes:
- core logic
- input params validation
- unit tests
- presence in the example DAG
- docstrings
- How-to and Integration documentation

Additionally, small improvements to GcpBodyFieldValidator were made:
- add simple list validation capability (type="list")
- introduced parameter allow_empty, which can be set to False
	to test for non-emptiness of a string instead of specifying
	a regexp.

Co-authored-by: sprzedwojski <[email protected]>
Co-authored-by: potiuk <[email protected]>

[AIRFLOW-2524] Update SageMaker hook and operators (apache#4091)

This re-works the SageMaker functionality in Airflow to be more complete, and more useful for the kinds of operations that SageMaker supports.

We removed some files and operators here, but these were only added after the last release so we don't need to worry about any sort of back-compat.

[AIRFLOW-3276] Cloud SQL: database create / patch / delete operators (apache#4124)

[AIRFLOW-2192] Allow non-latin1 usernames with MySQL backend by adding a SQL_ENGINE_ENCODING param and default to UTF-8 (apache#4087)

Comprised of:

Since we have unicode_literals imported and the engine arguments must be strings in Python 2, explicitly make 'utf-8' a string.

Replace bare exception with conf.AirflowConfigException for missing value.

It's just got for strings apparently.

Add utf-8 to default_airflow.cfg - question: do I still need the try/except block or can we depend on defaults? (I note some have both.)

Get rid of try/except block and depend on default_airflow.cfg

Use __str__ since calling str just gives us back a newstr as well.

Test that a panda user can be saved.

[AIRFLOW-3295] Fix potential security issue in DaskExecutor (apache#4128)

When user decides to use TLS/SSL encryption
for DaskExecutor communications,
`Distributed.Security` object will be created.

However, the argument `require_encryption` was missed and not set to `True` (its default value is `False`).

This may break the TLS/SSL encryption set-up.

[AIRFLOW-XXX] Fix flake8 errors from apache#4144

[AIRFLOW-2574] Cope with '%' in SQLA DSN when running migrations (apache#3787)

Alembic uses a ConfigParser like Airflow does, and "%" is a special value in there, so we need to escape it. As per the Alembic docs:

> Note that this value is passed to ConfigParser.set, which supports
> variable interpolation using pyformat (e.g. `%(some_value)s`). A raw
> percent sign not part of an interpolation symbol must therefore be
> escaped, e.g. `%%`

[AIRFLOW-3090] Demote dag start/stop log messages to debug (apache#3920)

[AIRFLOW-3090] Specify path of key file in log message (apache#3921)

[AIRFLOW-3111] Fix instructions in UPDATING.md and remove comment artifacts in default_airflow.cfg (apache#3944)

- fixed incorrect instructions in UPDATING.md regarding core.log_filename_template and elasticsearch.elasticsearch_log_id_template
- removed comments referencing "additional curly braces" from default_airflow.cfg since they're irrelevant to the rendered airflow.cfg

[AIRFLOW-3127] Fix out-dated doc for Celery SSL (apache#3967)

Now in `airflow.cfg`, for Celery-SSL, the item names are
"ssl_active", "ssl_key", "ssl_cert", and "ssl_cacert".
(since PR https://github.com/apache/incubator-airflow/pull/2806/files)

But in the documentation
https://airflow.incubator.apache.org/security.html?highlight=celery
or
https://github.com/apache/incubator-airflow/blob/master/docs/security.rst,
it's "CELERY_SSL_ACTIVE", "CELERY_SSL_KEY", "CELERY_SSL_CERT", and
"CELERY_SSL_CACERT", which is out-dated and may confuse readers.

[AIRFLOW-3187] Update airflow.gif file with a slower version (apache#4033)

[AIRFLOW-3164] Verify server certificate when connecting to LDAP (apache#4006)

Misconfiguration and improper checking of exceptions disabled
server certificate checking. We now only support TLS connections
and do not support insecure connections anymore.

[AIRFLOW-2779] Add license headers to doc files (apache#4178)

This adds ASF license headers to all the .rst and .md files with the
exception of the Pull Request template (as that is included verbatim
when opening a Pull Request on Github which would be messy)

Added the include_deleted parameter to salesforce hook